J-SGPGN: paraphrase generation network based on joint learning of sequence and graph
Zhirong HOU, Xiaodong FAN, Hua ZHANG, Xiaonan MA
Journal of Computer Applications    2023, 43 (5): 1365-1371.   DOI: 10.11772/j.issn.1001-9081.2022040626
Abstract

Paraphrase generation is a text data augmentation method based on Natural Language Generation (NLG). To address the problems of repetitive generation, semantic errors, and poor diversity in paraphrase generation methods based on the Sequence-to-Sequence (Seq2Seq) framework, a Paraphrase Generation Network based on Joint learning of Sequence and Graph (J-SGPGN) was proposed. Graph encoding and sequence encoding were fused in the encoder of J-SGPGN for feature enhancement, and two decoding methods, sequence generation and graph generation, were designed in the decoder for parallel decoding. The model was then trained with a joint learning method that combines syntactic supervision with semantic supervision to improve both the accuracy and the diversity of generation. Experimental results on the Quora dataset show that the generation accuracy indicator METEOR (Metric for Evaluation of Translation with Explicit ORdering) of J-SGPGN is 3.44 percentage points higher than that of the most accurate baseline, RNN+GCN, and that its generation diversity indicator Self-BLEU (Self-BiLingual Evaluation Understudy) is 12.79 percentage points lower than that of the most diverse baseline, the Back-Translation guided multi-round Paraphrase Generation (BTmPG) model. These results verify that J-SGPGN can generate paraphrase text with more accurate semantics and more diverse expressions.
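Self-BLEU, the diversity indicator cited above, scores each generated sentence with BLEU against all the other generations and averages the results; lower values indicate more diverse output. A minimal pure-Python sketch is given below, using a simplified sentence-level BLEU (uniform n-gram weights, no smoothing) — the paper's exact evaluation configuration may differ:

```python
from collections import Counter
import math

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, references, max_n=2):
    """Simplified sentence-level BLEU: clipped n-gram precision
    (uniform weights up to max_n) times a brevity penalty. No smoothing."""
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        if not cand_counts:
            return 0.0
        # For each n-gram, the maximum count seen in any single reference.
        max_ref_counts = Counter()
        for ref in references:
            for g, c in Counter(ngrams(ref, n)).items():
                max_ref_counts[g] = max(max_ref_counts[g], c)
        clipped = sum(min(c, max_ref_counts[g]) for g, c in cand_counts.items())
        if clipped == 0:
            return 0.0
        precisions.append(clipped / sum(cand_counts.values()))
    # Brevity penalty against the reference length closest to the candidate's.
    ref_len = min((len(r) for r in references),
                  key=lambda rl: abs(rl - len(candidate)))
    bp = 1.0 if len(candidate) > ref_len else math.exp(1 - ref_len / max(len(candidate), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

def self_bleu(sentences, max_n=2):
    """Average BLEU of each sentence against the others; lower = more diverse."""
    scores = [bleu(s, sentences[:i] + sentences[i + 1:], max_n)
              for i, s in enumerate(sentences)]
    return sum(scores) / len(scores)
```

For example, three identical paraphrases yield a Self-BLEU of 1.0 (no diversity), while paraphrases sharing no n-grams yield 0.0.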
